# Japanese Dependency Parsing

## Bert Base Japanese Wikipedia Ud Head

A BERT model built for Japanese dependency parsing that detects the head word within long unit words, framed as a question-answering task.

Tags: Sequence Labeling, Transformers, Japanese · Author: KoichiYasuoka · Downloads: 474 · Likes: 1
## Deberta Base Japanese Aozora Ud Head

A DeBERTa (V2) model pre-trained on Aozora Bunko texts for Japanese dependency parsing (head-word detection within long unit words), also framed as a question-answering task.

Tags: Sequence Labeling, Transformers, Japanese · Author: KoichiYasuoka · Downloads: 707 · Likes: 0
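Both entries above frame head detection as extractive question answering: the dependent word is given as the "question", the full sentence as the "context", and the predicted answer span is read off as the head word. The sketch below illustrates that interface through the standard Hugging Face question-answering pipeline; the model ID is taken from the entry name and the input convention is an assumption based on the card descriptions, so consult the actual model card before relying on it.

```python
# Minimal sketch (assumption: the model follows the standard extractive QA
# interface, with the dependent word as the question and the sentence as the
# context). The Japanese BERT tokenizer typically also needs
# `pip install fugashi unidic-lite`.
from transformers import pipeline

qa = pipeline(
    "question-answering",
    model="KoichiYasuoka/bert-base-japanese-wikipedia-ud-head",
)

sentence = "全学年にわたって小学校の国語の教科書に挿し絵が用いられている"
result = qa(question="国語", context=sentence)

# The predicted answer span is interpreted as the head word of "国語".
print(result["answer"], result["score"])
```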
## Transformers Ud Japanese Electra Base Ginza 510

License: MIT

A Japanese pretrained model based on the ELECTRA architecture, pretrained on approximately 200 million Japanese sentences from the mC4 dataset and fine-tuned on the UD_Japanese-BCCWJ corpus.

Tags: Sequence Labeling, Transformers, Japanese · Author: megagonlabs · Downloads: 7,757 · Likes: 2
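The "ginza" in the name suggests this checkpoint is meant to be consumed through GiNZA, a spaCy-based Japanese NLP library, rather than loaded directly. A minimal sketch is shown below, assuming the GiNZA `ja_ginza_electra` pipeline wraps this model; the pipeline name and installation commands are assumptions not stated in the entry above.

```python
# Hypothetical usage sketch via spaCy/GiNZA (assumed packages:
# `pip install ginza ja_ginza_electra`). Dependency parses come from the
# standard spaCy token attributes.
import spacy

nlp = spacy.load("ja_ginza_electra")
doc = nlp("銀座でランチをご一緒しましょう。")

# Print each token with its dependency relation and head word.
for token in doc:
    print(token.text, token.dep_, token.head.text)
```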